Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities


Similar articles

Gaussian Optimality for Derivatives of Differential Entropy Using Linear Matrix Inequalities

Let Z be a standard Gaussian random variable, X be independent of Z, and t be a strictly positive scalar. For the derivatives in t of the differential entropy of X + √t Z, McKean noticed that Gaussian X achieves the extreme for the first and second derivatives, among distributions with a fixed variance, and he conjectured that this holds for general orders of derivatives. This conjecture implies ...
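
For orientation, a standard reference computation (not taken from the gated full text): de Bruijn's identity expresses the first derivative through Fisher information, and for Gaussian X the derivatives are explicit and alternate in sign, which is the pattern McKean conjectured for general X with the same variance.

% de Bruijn's identity and the Gaussian reference case (standard facts; sigma^2 = Var(X))
\[
  \frac{\partial}{\partial t}\, h\!\left(X + \sqrt{t}\,Z\right)
  = \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right)
  \qquad \text{(de Bruijn; } J \text{ denotes Fisher information)}
\]
\[
  X \sim \mathcal{N}(0,\sigma^2):\quad
  h\!\left(X + \sqrt{t}\,Z\right) = \tfrac{1}{2}\log\!\bigl(2\pi e\,(\sigma^2 + t)\bigr),\quad
  \frac{\partial h}{\partial t} = \frac{1}{2(\sigma^2+t)} > 0,\quad
  \frac{\partial^2 h}{\partial t^2} = -\frac{1}{2(\sigma^2+t)^2} < 0 .
\]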


The Rate of Entropy for Gaussian Processes

In this paper, we show that the Tsallis entropy rate for stochastic processes can be obtained as the limit of conditional entropies, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Rényi, Shannon and Tsallis entropy rates for stationary Gaussian processes ...
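
A small illustration of the conditional-entropy route mentioned above, restricted to the Shannon case (the AR(1) model, parameter values, and the helper name cond_entropy are my own choices, not the paper's): for a stationary Gaussian AR(1) process, h(X_n | X_1, ..., X_{n-1}) follows from log-determinants of Toeplitz covariance matrices and equals the entropy rate (1/2) log(2*pi*e*sigma2) in nats.

import numpy as np
from scipy.linalg import toeplitz

# Stationary Gaussian AR(1): X_t = a*X_{t-1} + eps_t, eps_t ~ N(0, sigma2), |a| < 1.
a, sigma2 = 0.8, 1.0

def acov(k):
    """Autocovariance gamma(k) of the stationary AR(1) process."""
    return sigma2 / (1.0 - a ** 2) * a ** abs(k)

def cond_entropy(n):
    """h(X_n | X_1, ..., X_{n-1}) in nats, via a Gaussian log-determinant ratio."""
    S = toeplitz([acov(k) for k in range(n)])
    _, logdet_n = np.linalg.slogdet(S)
    _, logdet_m = np.linalg.slogdet(S[:-1, :-1])
    return 0.5 * (np.log(2 * np.pi * np.e) + logdet_n - logdet_m)

for n in (2, 5, 20, 50):
    print(n, cond_entropy(n))
print("entropy rate:", 0.5 * np.log(2 * np.pi * np.e * sigma2))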


Controller Design Using Linear Matrix Inequalities

2.3. H∞ Performance
3. Controller Design Using Linear Matrix Inequalities
3.1. Linearizing Change of Variables – State Feedback
3.2. Linearizing Change of Variables – Output Feedback
3.3. LMI Approach to Multiobjective Design
3.4. Existence of Solutions and Conservatism of Design
4. Illustrative Design Example: Robust Control of a Power System Stabilizer
4.1. Problem Description
4.2. Design Speci...
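
A minimal sketch of the linearizing change of variables for state feedback (item 3.1), on a toy system of my own rather than the chapter's power-system example: with Y = P ≻ 0 and W = K·Y, the bilinear condition (A + BK)P + P(A + BK)ᵀ ≺ 0 becomes the LMI A·Y + Y·Aᵀ + B·W + Wᵀ·Bᵀ ≺ 0, which is linear in (Y, W); the gain is recovered as K = W·Y⁻¹.

import numpy as np
import cvxpy as cp

A = np.array([[0.0, 1.0],
              [2.0, -1.0]])          # eigenvalues 1 and -2: open loop is unstable
B = np.array([[0.0],
              [1.0]])
n, m = A.shape[0], B.shape[1]

Y = cp.Variable((n, n), symmetric=True)   # Y plays the role of the Lyapunov matrix P
W = cp.Variable((m, n))                   # W = K @ Y, the linearizing change of variables
eps = 1e-3
lmi = A @ Y + Y @ A.T + B @ W + W.T @ B.T
prob = cp.Problem(cp.Minimize(0),
                  [Y >> eps * np.eye(n), lmi << -eps * np.eye(n)])
prob.solve(solver=cp.SCS)

K = W.value @ np.linalg.inv(Y.value)      # recover the state-feedback gain
print("K =", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A + B @ K))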


Using operational matrix for numerical solution of fractional differential equations

In this article, we discuss a new application of modified hat functions to nonlinear multi-order fractional differential equations. The operational matrix of fractional integration is derived and used to transform the main equation to a system of algebraic equations. The method provides the solution in the form of a rapidly convergent series. Furthermore, error analysis of the pro...
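
The article's modified-hat-function matrix is not reproduced here; as a sketch of the same operational-matrix idea with the simpler block-pulse basis (the closed-form matrix below is the usual block-pulse operational matrix of Riemann–Liouville integration; the name bp_frac_int_matrix and the test f(t) = t are my own), fractional integration of an expanded function reduces to a vector–matrix product.

import numpy as np
from math import gamma

def bp_frac_int_matrix(m, alpha, T=1.0):
    """Block-pulse operational matrix of Riemann-Liouville fractional integration of order alpha."""
    h = T / m
    xi = np.empty(m)
    xi[0] = 1.0
    for k in range(1, m):
        xi[k] = (k + 1) ** (alpha + 1) - 2 * k ** (alpha + 1) + (k - 1) ** (alpha + 1)
    F = np.zeros((m, m))
    for i in range(m):
        F[i, i:] = xi[: m - i]        # upper-triangular Toeplitz structure
    return (h ** alpha / gamma(alpha + 2)) * F

m, alpha = 64, 0.5
t_mid = (np.arange(m) + 0.5) / m                    # block midpoints on [0, 1]
c = t_mid                                           # block-pulse coefficients of f(t) = t
approx = c @ bp_frac_int_matrix(m, alpha)           # approximate values of (I^alpha f) per block
exact = t_mid ** (1 + alpha) / gamma(2 + alpha)     # closed form: I^alpha t = t^(1+alpha)/Gamma(2+alpha)
print("max abs error:", np.max(np.abs(approx - exact)))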


Gaussian mixtures: entropy and geometric inequalities

A symmetric random variable is called a Gaussian mixture if it has the same distribution as the product of two independent random variables, one being positive and the other a standard Gaussian random variable. Examples of Gaussian mixtures include random variables with densities proportional to e^(−|t|^p) and symmetric p-stable random variables, where p ∈ (0, 2]. We obtain various sharp moment an...
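
A concrete instance of this definition (a standard fact, not specific to the paper): the Laplace law, with density proportional to e^(−|t|), i.e. the p = 1 case above, has the same distribution as √(2E)·Z with E standard exponential independent of Z standard Gaussian. A quick simulation comparing absolute moments is consistent with this.

import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Laplace as a Gaussian mixture: X = sqrt(2*E) * Z, E ~ Exp(1) independent of Z ~ N(0, 1).
E = rng.exponential(1.0, n)
Z = rng.standard_normal(n)
x_mixture = np.sqrt(2.0 * E) * Z

x_laplace = rng.laplace(0.0, 1.0, n)      # direct Laplace samples for comparison

# Empirical absolute moments E|X|^q should agree between the two samples.
for q in (0.5, 1.0, 2.0, 3.0):
    print(f"q={q}:  mixture {np.mean(np.abs(x_mixture) ** q):.4f}"
          f"   laplace {np.mean(np.abs(x_laplace) ** q):.4f}")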



Journal

Journal title: Entropy

Year: 2018

ISSN: 1099-4300

DOI: 10.3390/e20030182